LEET ’11: 4th USENIX Workshop on Large-Scale Exploits and Emergent Threats

Author

  • Nick Mathewson
Abstract

Nick Mathewson introduced the Tor project. Tor is free, open-source software that enables online anonymity and is used by approximately 250,000 people every day. Anonymity means that an attacker cannot learn who is communicating with whom in a communication network, and Tor aims to be usable, deployable software that maximizes online anonymity. Anonymity serves different interests for different users: it is privacy for private citizens, network security for businesses, traffic-analysis resistance for governments, and reachability for human-rights activists. To be anonymous, it is important to have company: users hide among the other users of an anonymity network. The Tor project benefits good people more, because bad people are already doing well without it. Tor uses a multiple-hop relay network design.

This work studies to what extent file hosting services (FHSes) guarantee privacy, and whether it is possible for an attacker to guess the URI that identifies a file and thus illegally download it. Nick said he studied how 100 FHSes generate their URIs, finding that they typically generate them sequentially. In particular, 20 of them add no non-guessable information at all, making it trivial for an attacker to enumerate all the files uploaded to the service and download them. The services that do not use sequential numbers might appear to have better security, but Nick showed that many of them use short identifiers that are easy to brute-force. After obtaining the URI that identifies a file, an attacker can find out whether the file was uploaded as private: it is enough to search for the file on a search engine, and if no results are returned, there is a good chance that the file is private. Nick said that 54% of the files they crawled were private. They then created honey files that "called home" when opened, to see whether people crawled for them and downloaded them. These files were downloaded 275 times, by more than 80 unique IP addresses, which is proof that there is a problem of people looking for private files on FHSes. To check whether these people's intentions are malicious, Nick created a fake credit-card trading Web site and placed information about it, and the credentials to log into it, in some of the honey files. In total, they logged 93 logins to the fake Web site, from 43 different IP addresses. This …
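The enumeration risk described above follows directly from how the identifiers are constructed. The sketch below (Python, not the study's actual crawler; the service URL pattern and parameters are hypothetical) illustrates why sequential numeric IDs can be walked linearly and why even short random-looking IDs remain cheap to brute-force.

```python
# A minimal sketch (not the study's actual crawler) of why weak FHS
# identifiers are guessable. The service URL pattern is hypothetical.
import string

BASE_URL = "https://fhs.example.com/download/{file_id}"  # hypothetical

def sequential_uris(start: int, count: int):
    """If a service hands out numeric IDs in order, every ID between two
    observed uploads is a valid file, so enumeration is a simple loop."""
    for file_id in range(start, start + count):
        yield BASE_URL.format(file_id=file_id)

def keyspace(alphabet: str, length: int) -> int:
    """Number of guesses needed to exhaust a random ID of a given length."""
    return len(alphabet) ** length

if __name__ == "__main__":
    for uri in sequential_uris(1000, 5):
        print(uri)
    # A 6-character lowercase alphanumeric ID gives only 36**6, about
    # 2.2 billion possibilities: within reach of a distributed crawler.
    print(keyspace(string.ascii_lowercase + string.digits, 6))
```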


Similar Articles

LEET ’13: 6th USENIX Workshop on Large-Scale Exploits and Emergent Threats

Luca began his presentation by outlining his goal of changing vulnerability risk-assessment methodology into something more Internet-scale, allowing better quantitative assessment of vulnerabilities. The methodology does not work for targeted attacks against specific organizations, but it does work for widespread, untargeted attacks. CVSS doesn’t do a great job of identifying high use a...


Observations on Emerging Threats

Trend Micro's Threat Research group is specifically tasked with looking ahead at the threat landscape and working with technology and/or various product development groups inside the company to ensure that, as a company, we deliver the appropriate security solutions to address emerging threats to our customers. Accomplishing this requires our threat research group to understand, explore, and dec...


An Ant Colony Optimization Algorithm for Network Vulnerability Analysis

Intruders often combine exploits against multiple vulnerabilities in order to break into the system. Each attack scenario is a sequence of exploits launched by an intruder that leads to an undesirable state such as access to a database, service disruption, etc. The collection of possible attack scenarios in a computer network can be represented by a directed graph, called network attack gra...
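As a rough illustration of the directed-graph representation mentioned in this abstract, the following sketch enumerates attack scenarios as paths through a toy attack graph; the states and exploits are invented examples, not data from the paper.

```python
# A toy attack graph: nodes are network states, directed edges are
# exploits, and an attack scenario is a path from the attacker's entry
# point to an undesirable goal state. All names here are illustrative.
from collections import defaultdict

attack_graph = defaultdict(list)  # state -> [(next_state, exploit), ...]
attack_graph["internet"].append(("web_server", "web app RCE"))
attack_graph["web_server"].append(("app_server", "weak SSH credentials"))
attack_graph["web_server"].append(("database", "exposed DB port"))
attack_graph["app_server"].append(("database", "SQL injection"))

def attack_scenarios(state, goal, path=()):
    """Depth-first enumeration of all exploit sequences reaching the goal."""
    if state == goal:
        yield path
        return
    for next_state, exploit in attack_graph[state]:
        if next_state not in {s for s, _ in path}:  # no revisited states
            yield from attack_scenarios(
                next_state, goal, path + ((next_state, exploit),))

for scenario in attack_scenarios("internet", "database"):
    print(" -> ".join(exploit for _, exploit in scenario))
```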


FastReplica: Efficient Large File Distribution Within Content Delivery Networks

In this work, we consider a large-scale distributed network of servers and a problem of content distribution across it. We propose a novel algorithm, called FastReplica, for an efficient and reliable replication of large files in the Internet environment. There are a few basic ideas exploited in FastReplica. In order to replicate a large file among n nodes (n is in the range of 10-30 nodes), th...
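The summary truncates before describing the mechanism. As a hedged sketch of the chunk-and-exchange idea FastReplica is known for (split the file into n subfiles, push one subfile to each node, then let the nodes exchange subfiles among themselves), with transfers simulated in memory rather than over real connections:

```python
# A minimal sketch based on the published FastReplica idea, not code from
# the paper. Distribution step: the origin sends subfile i to node i only,
# so the origin's uplink carries each byte once instead of n times.
# Collection step: each node forwards its subfile to the other n-1 nodes.

def fast_replica(file_bytes: bytes, nodes: list[str]) -> dict[str, bytes]:
    n = len(nodes)
    chunk = (len(file_bytes) + n - 1) // n
    subfiles = [file_bytes[i * chunk:(i + 1) * chunk] for i in range(n)]

    # Distribution: node i initially holds only subfile i.
    received = {node: {i: subfiles[i]} for i, node in enumerate(nodes)}

    # Collection: nodes exchange subfiles over the n*(n-1) node-to-node
    # links, spreading the remaining transfer load across the group.
    for i, sender in enumerate(nodes):
        for receiver in nodes:
            if receiver != sender:
                received[receiver][i] = subfiles[i]

    # Each node reassembles the full file from the n subfiles.
    return {node: b"".join(parts[i] for i in range(n))
            for node, parts in received.items()}

replicas = fast_replica(b"example payload " * 64, ["n1", "n2", "n3", "n4"])
assert all(r == b"example payload " * 64 for r in replicas.values())
```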





Publication date: 2011